Expectation-Maximization-Maximization: A Feasible MLE Algorithm for the Three-Parameter Logistic Model Based on a Mixture Modeling Reformulation

Authors

  • Chanjin Zheng
  • Xiangbin Meng
  • Shaoyang Guo
  • Zhengguang Liu
Abstract

Stable maximum likelihood estimation (MLE) of item parameters in the three-parameter logistic model (3PLM) with a modest sample size remains a challenge. The current study presents a mixture-modeling reformulation of the 3PLM, based on which a feasible Expectation-Maximization-Maximization (EMM) MLE algorithm is proposed. A simulation study indicates that EMM is comparable to Bayesian EM in terms of bias and RMSE, and that EMM produces smaller standard errors (SEs) than MMLE/EM. To further demonstrate its feasibility, the method is also applied to two real-world data sets. The point estimates from EMM are close to those from the commercial programs BILOG-MG and flexMIRT, but the SEs are smaller.
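
For reference, the 3PLM item response function and the standard mixture-style decomposition that such a reformulation rests on can be written as follows; this is a well-known identity, and the paper's exact parameterization (including the latent "knowing" indicator, denoted W here only for illustration) may differ:

P(X_{ij}=1 \mid \theta_i) = c_j + (1 - c_j)\,P^{*}_j(\theta_i), \qquad P^{*}_j(\theta_i) = \frac{1}{1+\exp\{-a_j(\theta_i - b_j)\}}.

Introducing a latent indicator W_{ij} with \Pr(W_{ij}=1 \mid \theta_i) = P^{*}_j(\theta_i) ("examinee i knows item j"), and letting a non-knowing examinee guess correctly with probability c_j, recovers the same marginal probability:

\Pr(X_{ij}=1 \mid \theta_i) = P^{*}_j(\theta_i) + \{1 - P^{*}_j(\theta_i)\}\,c_j = c_j + (1-c_j)\,P^{*}_j(\theta_i).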

Similar resources

Technical Details about the Expectation Maximization (EM) Algorithm

P(X|θ) is the (observable) data likelihood; the parameter θ is sometimes omitted for notational simplicity. MLE is normally done by taking the derivative of the data likelihood P(X) with respect to the model parameter θ and solving the resulting equation. However, when the model contains hidden (unobserved) variables, the derivative with respect to the model parameter does not have a closed-form solutio...
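
As a reminder of the general recipe sketched above (the standard EM iteration, not specific to any single paper listed here): with observed data X, hidden variables Z, and parameters θ, iteration t computes

Q(\theta \mid \theta^{(t)}) = \mathbb{E}_{Z \mid X,\, \theta^{(t)}}\!\big[\log P(X, Z \mid \theta)\big] \quad \text{(E-step)}, \qquad \theta^{(t+1)} = \arg\max_{\theta}\, Q(\theta \mid \theta^{(t)}) \quad \text{(M-step)},

and each such iteration is guaranteed not to decrease the observed-data log-likelihood \log P(X \mid \theta).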

Online k-MLE for Mixture Modeling with Exponential Families

This paper addresses the problem of online learning of finite statistical mixtures of exponential families. A short review of the Expectation-Maximization (EM) algorithm and its online extensions is given. From these extensions and the description of the k-Maximum Likelihood Estimator (k-MLE), three online extensions are proposed for the latter. To illustrate them, we consider the case of mixtures o...
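
As a rough illustration of the batch k-MLE idea that such online extensions build on (a hard-assignment counterpart of EM), here is a minimal sketch for univariate Gaussian components; the function and variable names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def k_mle_gaussian(x, k, n_iters=50, rng=np.random.default_rng(0)):
    """Batch k-MLE sketch for a univariate Gaussian mixture (hard assignments)."""
    mu = rng.choice(x, size=k, replace=False).astype(float)   # initial means
    var = np.full(k, x.var(), dtype=float)                    # initial variances
    w = np.full(k, 1.0 / k)                                   # mixture weights
    for _ in range(n_iters):
        # Hard assignment: each point goes to the component with the
        # highest weighted log-density (a k-means-like step).
        logdens = (np.log(w)
                   - 0.5 * np.log(2 * np.pi * var)
                   - 0.5 * (x[:, None] - mu) ** 2 / var)
        z = logdens.argmax(axis=1)
        # Per-component MLE on the points assigned to that component.
        for j in range(k):
            xj = x[z == j]
            if xj.size > 1:
                mu[j], var[j] = xj.mean(), xj.var()
            w[j] = max(xj.size, 1) / x.size
        w /= w.sum()
    return w, mu, var
```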

Fast Learning of Gamma Mixture Models with k-MLE

We introduce a novel algorithm to learn mixtures of Gamma distributions. This is an extension of the k-Maximum Likelihood Estimator algorithm for mixtures of exponential families. Although Gamma distributions are exponential families, we cannot rely directly on the exponential-family tools due to the lack of a closed-form formula and the cost of numerical approximation: our method uses Gamma di...
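
To illustrate the lack of a closed-form MLE alluded to above, here is a minimal sketch of the usual Newton iteration for the shape parameter of a single Gamma distribution (using SciPy's digamma/trigamma); it is an illustrative sketch under standard assumptions, not the paper's method:

```python
import numpy as np
from scipy.special import digamma, polygamma

def gamma_mle(x, n_iters=20):
    """MLE of Gamma(shape k, scale theta) via Newton's method on the shape."""
    s = np.log(x.mean()) - np.log(x).mean()                    # sufficient statistic
    k = (3 - s + np.sqrt((s - 3) ** 2 + 24 * s)) / (12 * s)    # standard initial guess
    for _ in range(n_iters):
        # Solve log(k) - digamma(k) = s for k by Newton's method.
        k -= (np.log(k) - digamma(k) - s) / (1 / k - polygamma(1, k))
    return k, x.mean() / k                                     # shape, scale
```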

Consistency, Breakdown Robustness, and Algorithms for Robust Improper Maximum Likelihood Clustering

The robust improper maximum likelihood estimator (RIMLE) is a new method for robust multivariate clustering that finds approximately Gaussian clusters. It maximizes a pseudolikelihood defined by adding to a Gaussian mixture a component with improper constant density that accommodates outliers. A special case of the RIMLE is the MLE for multivariate finite Gaussian mixture models. In this paper we trea...
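
In the notation of a standard Gaussian mixture with density \varphi, the pseudolikelihood described above can be sketched as

\ell_c(\pi, \mu, \Sigma) = \sum_{i=1}^{n} \log\!\Big( \pi_0\, c + \sum_{j=1}^{k} \pi_j\, \varphi(x_i;\, \mu_j, \Sigma_j) \Big), \qquad \pi_0 + \sum_{j=1}^{k} \pi_j = 1,\; c > 0,

where c is the improper constant density that absorbs outliers; setting \pi_0 = 0 recovers the ordinary Gaussian mixture log-likelihood, i.e., the MLE special case mentioned above. (This is only a sketch based on the description here; the paper's exact constraints on c and the weights may differ.)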

Fuzzy Class Logistic Regression Analysis

Distribution mixtures are used as models to analyze grouped data. Parameter estimation is an important step for mixture distributions. The latent class model is generally used for the analysis of mixture distributions for discrete data. In this paper, we consider parameter estimation for a mixture of logistic regression models. We know that the expectation maximization (EM) algorithm...
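
For orientation only, here is a minimal sketch of the standard EM scheme for a mixture of logistic regressions, which a fuzzy-class approach would modify; it assumes a 0/1-coded response and uses scikit-learn's weighted logistic regression, and all names are hypothetical:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def em_logistic_mixture(X, y, k=2, n_iters=50, rng=np.random.default_rng(0)):
    """EM sketch for a k-component mixture of logistic regressions (y coded 0/1)."""
    n = X.shape[0]
    pi = np.full(k, 1.0 / k)                         # mixing proportions
    resp = rng.dirichlet(np.ones(k), size=n)         # random initial responsibilities
    models = [LogisticRegression() for _ in range(k)]
    for _ in range(n_iters):
        # M-step: weighted logistic regression per component + proportion update.
        for j in range(k):
            models[j].fit(X, y, sample_weight=resp[:, j] + 1e-12)
        pi = resp.mean(axis=0)
        # E-step: responsibilities proportional to pi_j * Bernoulli likelihood.
        lik = np.column_stack([
            np.where(y == 1, m.predict_proba(X)[:, 1], m.predict_proba(X)[:, 0])
            for m in models
        ])
        resp = pi * lik
        resp /= resp.sum(axis=1, keepdims=True)
    return pi, models, resp
```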

Journal:

Volume 8, Issue

Pages  -

Publication date: 2017